Why Not Run the Efficient Global Optimization Algorithm with Multiple Surrogates?

Authors

  • Felipe A. C. Viana
  • Raphael T. Haftka
Abstract

Surrogate-based optimization has become popular in the design of complex engineering systems. Each optimization cycle consists of analyzing a number of designs, fitting a surrogate, performing optimization based on the surrogate, and finally performing exact simulation at the design obtained by the optimization. Adaptive sampling algorithms that add one point per cycle are readily available in the literature. They use uncertainty estimators to guide the selection of the next sampling point(s). The addition of one point at a time may not be efficient when it is possible to run simulations in parallel. So we propose an algorithm for adding several points per optimization cycle based on the simultaneous use of multiple surrogates. The need for uncertainty estimates usually limits adaptive sampling algorithms to surrogates such as kriging and polynomial response surface because of the lack of uncertainty estimates in the implementation of other surrogates. We import uncertainty estimates from surrogates having such estimates to use with other surrogates such as support vector regression models. The approach was tested on two analytic examples for nine basic surrogates including kriging, radial basis neural networks, linear Shepard and support vector regression. For these examples we compare our approach with traditional sequential optimization based on kriging. We found that our approach was able to deliver better results in a fraction of the optimization cycles needed by the traditional kriging implementation.
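As a rough illustration of the idea described above, the sketch below pairs each surrogate's mean prediction with the kriging standard error to compute an expected-improvement score, and returns one candidate per surrogate for parallel evaluation. It is a minimal sketch, not the authors' implementation: scikit-learn's GaussianProcessRegressor stands in for kriging, SVR stands in for a surrogate without a native uncertainty estimate, and the helper names are ours.

    import numpy as np
    from scipy.stats import norm
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.svm import SVR

    def expected_improvement(mu, sigma, y_best):
        # Standard expected-improvement criterion for minimization.
        sigma = np.maximum(sigma, 1e-12)
        z = (y_best - mu) / sigma
        return (y_best - mu) * norm.cdf(z) + sigma * norm.pdf(z)

    def propose_points(X, y, candidates):
        """Return one candidate per surrogate for the next cycle (illustrative)."""
        kriging = GaussianProcessRegressor(normalize_y=True).fit(X, y)
        svr = SVR(kernel="rbf").fit(X, y)

        # Kriging supplies both the mean and the uncertainty estimate.
        mu_krg, sigma = kriging.predict(candidates, return_std=True)
        # The SVR supplies only a mean; the kriging sigma is "imported" for it.
        mu_svr = svr.predict(candidates)

        y_best = y.min()
        picks = []
        for mu in (mu_krg, mu_svr):
            ei = expected_improvement(mu, sigma, y_best)
            picks.append(candidates[np.argmax(ei)])
        return np.array(picks)  # several points to simulate in parallel this cycle

In the paper the same kriging standard error is paired with each of nine surrogates; the two-surrogate loop above only shows the mechanism.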


Similar Articles

Efficient global optimization algorithm assisted by multiple surrogate techniques

Surrogate-based optimization proceeds in cycles. Each cycle consists of analyzing a number of designs, fitting a surrogate, performing optimization based on the surrogate, and finally analyzing a candidate solution. Algorithms that use the surrogate uncertainty estimator to guide the selection of the next sampling candidate are readily available, e.g., the efficient global optimization (EGO) al...
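The sampling criterion behind EGO is normally the expected improvement. Under the usual kriging assumptions, with prediction \hat{y}(x), standard error s(x), and best observed value y_{\min}, it is commonly written as (standard textbook form, not quoted from the snippet above):

    \mathrm{EI}(x) = \bigl(y_{\min} - \hat{y}(x)\bigr)\,
        \Phi\!\left(\frac{y_{\min} - \hat{y}(x)}{s(x)}\right)
        + s(x)\,\phi\!\left(\frac{y_{\min} - \hat{y}(x)}{s(x)}\right)

where \Phi and \phi are the standard normal CDF and PDF; EGO samples the point that maximizes EI(x) in each cycle.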


Modified Sine-Cosine Algorithm for Sizing Optimization of Truss Structures with Discrete Design Variables

This paper proposes a modified sine cosine algorithm (MSCA) for discrete sizing optimization of truss structures. The original sine cosine algorithm (SCA) is a population-based metaheuristic that fluctuates the search agents about the best solution based on sine and cosine functions. The efficiency of the original SCA in solving standard optimization problems of well-known mathematical function...
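For context, the position update of the original SCA that this snippet paraphrases is usually written as below (standard form of the algorithm, reproduced from memory rather than from this abstract), where P^t is the best solution found so far, r_2, r_3, r_4 are uniform random numbers, and r_1 decays linearly with the iteration count:

    X_i^{t+1} =
    \begin{cases}
      X_i^{t} + r_1 \sin(r_2)\,\bigl| r_3 P^{t} - X_i^{t} \bigr|, & r_4 < 0.5,\\[4pt]
      X_i^{t} + r_1 \cos(r_2)\,\bigl| r_3 P^{t} - X_i^{t} \bigr|, & r_4 \ge 0.5.
    \end{cases}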


Finding efficient frontier of process parameters for plastic injection molding

Product quality for plastic injection molding process is highly related with the settings for its process parameters. Additionally, the product quality is not simply based on a single quality index, but multiple interrelated quality indices. To find the settings for the process parameters such that the multiple quality indices can be simultaneously optimized is becoming a research issue and ...


Augmented Downhill Simplex a Modified Heuristic Optimization Method

Augmented Downhill Simplex Method (ADSM) is introduced here, that is a heuristic combination of Downhill Simplex Method (DSM) with Random Search algorithm. In fact, DSM is an interpretable nonlinear local optimization method. However, it is a local exploitation algorithm; so, it can be trapped in a local minimum. In contrast, random search is a global exploration, but less efficient. Here, rand...
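At a high level, the hybrid described here wraps a local simplex search in random global restarts. Below is a minimal sketch of that general pattern, using SciPy's Nelder-Mead routine rather than the authors' ADSM code, with an illustrative multimodal test function of our choosing.

    import numpy as np
    from scipy.optimize import minimize

    def simplex_with_random_restarts(f, bounds, n_restarts=20, seed=0):
        """Run Nelder-Mead from several random starting points and keep the best result."""
        rng = np.random.default_rng(seed)
        lo, hi = np.array(bounds).T
        best = None
        for _ in range(n_restarts):
            x0 = rng.uniform(lo, hi)                      # global exploration: random start
            res = minimize(f, x0, method="Nelder-Mead")   # local exploitation: simplex search
            if best is None or res.fun < best.fun:
                best = res
        return best

    # Example: a function where a single simplex run can get trapped in a local minimum.
    rastrigin = lambda x: 10 * len(x) + np.sum(x**2 - 10 * np.cos(2 * np.pi * x))
    result = simplex_with_random_restarts(rastrigin, bounds=[(-5.12, 5.12)] * 2)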


Efficient optimization of the likelihood function in Gaussian process modelling

Gaussian Process (GP) models are popular statistical surrogates used for emulating computationally expensive computer simulators. The quality of a GP model fit can be assessed by a goodness of fit measure based on optimized likelihood. Finding the global maximum of the likelihood function for a GP model is typically challenging, as the likelihood surface often has multiple local optima, and an ...
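A common, readily available way to cope with those local optima is to restart the likelihood optimization from several random hyperparameter values; scikit-learn exposes this through the n_restarts_optimizer argument. The snippet below only illustrates that multistart idea, not the method proposed in the paper above; the toy data are ours.

    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, ConstantKernel

    # Toy training data standing in for expensive simulator runs.
    X = np.random.default_rng(0).uniform(0, 1, size=(20, 2))
    y = np.sin(6 * X[:, 0]) + X[:, 1] ** 2

    kernel = ConstantKernel(1.0) * RBF(length_scale=[1.0, 1.0])
    # Each restart re-optimizes the log-marginal likelihood from random hyperparameters,
    # reducing the chance of ending in a poor local optimum.
    gp = GaussianProcessRegressor(kernel=kernel, n_restarts_optimizer=10).fit(X, y)
    print(gp.log_marginal_likelihood_value_)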


Publication year: 2010